Progress in Biomedical Optics and Imaging - Proceedings of SPIE ; 12465, 2023.
Article in English | Scopus | ID: covidwho-20234381

ABSTRACT

Although many AI-based studies on chest X-ray (CXR) interpretation have focused on COVID-19 diagnosis, fewer have addressed other relevant tasks, such as severity estimation, deterioration, and prognosis; the same holds for explainable approaches to COVID-19 prognosis estimation. The international hackathon launched during Dubai Expo 2020, aimed at designing machine learning solutions to help physicians formulate COVID-19 patients' prognoses, was the occasion to develop a machine learning model capable of predicting such prognoses and justifying them through interpretable explanations. The large hackathon dataset comprised subjects characterized by their CXRs and numerous clinical features collected during triage. To calculate the prognostic value, our model considered both patients' CXRs and clinical features. After automatic pre-processing to improve their quality, the CXRs were processed by a deep learning model to estimate the degree of lung compromise, which was then treated as an additional clinical feature. The original clinical parameters suffered from missing values, which were handled appropriately. We trained and evaluated multiple models to find the best one and fine-tuned it before inference. Finally, we produced novel explanations, both visual and numerical, to justify the model's predictions. Ultimately, our model processes a CXR and several clinical data points to estimate a patient's COVID-19 prognosis. It proved accurate, ranking second overall with 75%, 73.9%, and 74.4% in sensitivity, specificity, and balanced accuracy, respectively. In terms of explainability, it ranked first, as health professionals agreed it was the most interpretable. © 2023 SPIE.
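As a quick sanity check on the reported figures: for a binary classifier, balanced accuracy is the mean of sensitivity and specificity. The plain-Python sketch below plugs in the percentages quoted in the abstract and recovers the reported balanced accuracy up to rounding.

```python
# Balanced accuracy = (sensitivity + specificity) / 2 for a binary task.
# The input figures are the hackathon results reported in the abstract.
sensitivity = 75.0   # reported sensitivity (%)
specificity = 73.9   # reported specificity (%)

balanced_accuracy = (sensitivity + specificity) / 2
print(balanced_accuracy)  # 74.45, consistent with the reported 74.4% after rounding
```

This also illustrates why balanced accuracy is the preferred summary here: with imbalanced prognosis classes, plain accuracy would over-weight the majority class, while the mean of the two per-class rates does not.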
